
    Adversarial Multi-task Learning for Text Classification

    Neural network models have shown promise for multi-task learning, which focuses on learning shared layers that extract common, task-invariant features. However, in most existing approaches, the extracted shared features are prone to contamination by task-specific features or by noise brought in by other tasks. In this paper, we propose an adversarial multi-task learning framework that prevents the shared and private latent feature spaces from interfering with each other. We conduct extensive experiments on 16 different text classification tasks, which demonstrate the benefits of our approach. Moreover, we show that the shared knowledge learned by our model can be regarded as off-the-shelf knowledge and easily transferred to new tasks. The datasets of all 16 tasks are publicly available at \url{http://nlp.fudan.edu.cn/data/}.
    Comment: Accepted by ACL201
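    The adversarial ingredient described above, a task discriminator trained on the shared features whose reversed gradient pushes the shared encoder toward task-invariant representations, can be sketched as follows. The function names, the gradient-reversal formulation, and the coefficient `lam` are illustrative assumptions, not the paper's actual code:

    ```python
    import math

    def softmax(logits):
        # numerically stable softmax over a list of scores
        m = max(logits)
        exps = [math.exp(x - m) for x in logits]
        s = sum(exps)
        return [e / s for e in exps]

    def task_discriminator_loss(probs, task_id):
        # cross-entropy loss of a discriminator that tries to guess which
        # task produced a given shared feature vector
        return -math.log(probs[task_id])

    def reverse_gradient(grad, lam=1.0):
        # gradient reversal: the shared encoder receives the *negated*
        # discriminator gradient, so minimizing the discriminator's loss
        # on its side maximizes it on the encoder's side, driving the
        # shared space toward task-invariant features
        return [-lam * g for g in grad]

    # toy step: discriminator logits for 3 tasks on one shared feature
    probs = softmax([2.0, 0.5, -1.0])
    loss = task_discriminator_loss(probs, 0)
    # gradient of cross-entropy w.r.t. logits is (probs - one_hot)
    grad_logits = [p - (1.0 if i == 0 else 0.0) for i, p in enumerate(probs)]
    encoder_grad = reverse_gradient(grad_logits)
    ```

    In practice the reversal is implemented as an identity layer with a negated backward pass, so the discriminator and the shared encoder are trained jointly with a single optimizer.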

    Dynamic Compositional Neural Networks over Tree Structure

    Tree-structured neural networks have proven effective at learning semantic representations by exploiting syntactic information. In spite of their success, most existing models suffer from an underfitting problem: they recursively apply the same shared compositional function throughout the whole compositional process and lack expressive power due to their inability to capture the richness of compositionality. In this paper, we address this issue by introducing dynamic compositional neural networks over tree structure (DC-TreeNN), in which the compositional function is dynamically generated by a meta network. The role of the meta network is to capture the meta-knowledge shared across different compositional rules and to formulate them. Experimental results on two typical tasks show the effectiveness of the proposed models.
    Comment: Accepted by IJCAI 201
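    The key idea, a meta network that conditions the composition weights on the children rather than reusing one fixed function at every node, can be sketched with a single tree node. The gating formulation below is a simplified illustrative assumption (the meta network here only modulates a shared base matrix), not the paper's DC-TreeNN architecture:

    ```python
    import math

    def compose(left, right, base_W, meta_W):
        """One dynamic composition step at a binary tree node.

        left, right : child representations (lists of floats, length d)
        base_W      : shared d x 2d composition matrix
        meta_W      : d x 2d parameters of a hypothetical meta network that
                      produces a per-node gate from the children, so different
                      subtrees compose with different effective weights
        """
        ctx = left + right  # concatenated children, length 2d
        # meta network: sigmoid gate conditioned on the children
        gate = [1.0 / (1.0 + math.exp(-sum(w * c for w, c in zip(row, ctx))))
                for row in meta_W]
        # dynamically modulated composition: the gate rescales each row of
        # the shared matrix before the usual tanh nonlinearity
        return [math.tanh(g * sum(w * c for w, c in zip(row, ctx)))
                for g, row in zip(gate, base_W)]

    base_W = [[0.5, -0.2, 0.1, 0.3], [0.0, 0.4, -0.1, 0.2]]
    meta_W = [[0.1, 0.1, 0.1, 0.1], [0.2, -0.2, 0.2, -0.2]]
    parent = compose([0.1, 0.2], [0.3, 0.4], base_W, meta_W)
    ```

    Applied bottom-up over a parse tree, every node gets its own effective composition function while the parameter count stays fixed, which is what lets the model escape the single-shared-function underfitting problem.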

    The weather affects air conditioner purchases to fill the energy efficiency gap

    Energy efficiency improvement is often hindered by the energy efficiency gap. This paper examines the effect of short-run temperature fluctuations on Energy Star air conditioner purchases in the United States from 2006 to 2019 using transaction-level data. Results show that the probability of purchasing an Energy Star air conditioner increases as the weekly temperature before the transaction deviates from 20–22 °C. A larger response is associated with fewer cooling degree days in the previous years; higher electricity prices, income, educational levels, age, and rates of home ownership; more common use of electricity; and stronger concern about climate change. A 1 °C increase or decrease from 21 °C would reduce total energy expenditure by 35.46 and 17.73 million dollars nationwide (0.13% and 0.06% of the annual total energy expenditure on air conditioning), respectively. Our findings have important policy implications for demand-end interventions to incorporate the potential impact of the ambient physical environment.

    Development and characterization of 20 microsatellite markers in spotted sea bass (<em>Lateolabrax maculatus</em>) and cross-amplification in related species

    The spotted sea bass (Lateolabrax maculatus) is an economically valuable cultured fish species in China. In this study, 20 novel polymorphic microsatellite loci of L. maculatus were isolated from genomic data and characterized using 40 wild individuals. The number of alleles and the effective number of alleles ranged from 2 to 12 (average of 5.1000) and from 1.180 to 8.000 (average of 3.3097), respectively. The observed and expected heterozygosities ranged from 0.083 to 0.875 (average of 0.4405) and from 0.153 to 0.875 (average of 0.5633), respectively. Deviation from the Hardy-Weinberg equilibrium was observed in 11 loci. Cross-amplification in the related species Lates calcarifer achieved successful amplification with 16 primers. The microsatellite markers developed in this study could be used for research on genetic breeding of L. maculatus and on genetic relationships among the tested taxa.

    Multi-Scale Self-Attention for Text Classification

    In this paper, we introduce prior knowledge of multi-scale structure into self-attention modules. We propose a Multi-Scale Transformer that uses multi-scale multi-head self-attention to capture features from different scales. Based on a linguistic perspective and an analysis of a pre-trained Transformer (BERT) on a huge corpus, we further design a strategy to control the scale distribution for each layer. Results on three different kinds of tasks (21 datasets) show that our Multi-Scale Transformer outperforms the standard Transformer consistently and significantly on small and moderate-size datasets.
    Comment: Accepted in AAAI202
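    One concrete reading of multi-scale self-attention is a set of heads whose attention is restricted to windows of different sizes, with a small window acting as a local-scale head and a window covering the whole sequence recovering vanilla global attention. The sketch below follows that reading; the windowing scheme and function names are illustrative assumptions, not the paper's implementation:

    ```python
    import math

    def local_attention(seq, window):
        """Single-head self-attention restricted to a local window.

        seq    : list of feature vectors (lists of floats, equal length d)
        window : each position attends to itself and `window` neighbours on
                 each side; window >= len(seq) recovers global attention
        """
        d = len(seq[0])
        out = []
        for i, q in enumerate(seq):
            lo, hi = max(0, i - window), min(len(seq), i + window + 1)
            # scaled dot-product scores against the local context only
            scores = [sum(a * b for a, b in zip(q, seq[j])) / math.sqrt(d)
                      for j in range(lo, hi)]
            m = max(scores)
            w = [math.exp(s - m) for s in scores]
            z = sum(w)
            probs = [x / z for x in w]
            # attention-weighted average of the local values
            out.append([sum(p * seq[lo + j][k] for j, p in enumerate(probs))
                        for k in range(d)])
        return out

    def multi_scale_attention(seq, windows=(1, 3)):
        # run one head per scale and concatenate their outputs per position
        heads = [local_attention(seq, w) for w in windows]
        return [sum((h[i] for h in heads), []) for i in range(len(seq))]
    ```

    Controlling the scale distribution per layer then amounts to choosing the tuple of window sizes for each layer's heads, e.g. mostly small windows in lower layers and larger ones higher up.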